[Salon] Disinformation dilemma: US hands are way dirty, too

As Biden cracks down on Russian interference in our elections, a look at what the Pentagon has been doing overseas

It’s no secret that the Biden administration has made fighting online disinformation a major priority. On Wednesday, it announced sweeping measures to secure the 2024 election from interference, including seizing internet domains and sanctioning Russian operatives.

Such anti-disinformation measures are not without controversy. Just last week, Meta founder Mark Zuckerberg claimed in a letter to Congress that in 2021 the U.S. government had pressured Facebook to censor certain Covid-19 posts in an effort to tamp down what it believed to be misinformation.

“In 2021, senior officials from the Biden administration, including the White House, repeatedly pressured our teams for months to censor certain covid-19 content, including humor and satire, and expressed a lot of frustration with our teams when we didn’t agree,” Zuckerberg wrote in the letter sent Sunday. “Ultimately, it was our decision whether or not to take content down.”

While Zuckerberg’s allegations have sparked major debate over the extent to which the government should regulate social media, there can be no doubt that the proliferation of disinformation, particularly over social media, is a real cause for concern. A recent World Economic Forum report went so far as to name misinformation and disinformation the single greatest threat to global stability for 2024-2025.

However, while the United States makes stringent efforts to combat disinformation, particularly from foreign sources like Russia and China, history shows that it plays by different rules itself. Indeed, the National Security State has at times shown a problematic tendency to dabble in the very tactics it fights so vociferously when they come from other governments.

In recent years the United States has made a number of forays into covert online influence operations. In 2011, there was Operation Earnest Voice, a military program using “sock puppets” (fake social media accounts) to spread pro-U.S. narratives.

Similar efforts persist to this day. In 2022, the Stanford Internet Observatory released a study of America-based social media sock puppets. It analyzed thousands of coordinated Facebook and Twitter posts targeting people in Russia, China and Iran. Many of these posts contained sensational rumors, like stories of Iranians stealing the organs of Afghan refugees. Some accounts also impersonated hardliners and criticized the Iranian government for being too moderate. Later investigations linked a number of those accounts to the Pentagon.

“The sock puppet accounts were kind of funny to look at because we are so used to analyzing pro-Kremlin sock puppets, so it was weird to see accounts pushing the opposite narrative,” Shelby Grossman, a staffer at the Internet Observatory and a member of the research team that published the paper, told Gizmodo in August 2022.

Official U.S. documents also suggest a growing willingness to use disinformation as a tool of psychological operations (PSYOPs). An October 2022 SOCOM (Special Operations Command) procurement document requested new tools for “influence operations, digital deception, communication disruption, and disinformation campaigns at the tactical edge and operational levels,” as well as the kind of technology used to generate online deepfakes.

And, as recently as last month, pro-American messages tied to the U.S. military were appearing in ads (in Arabic) on the dating app Tinder in Lebanon.

When it comes to the fight over online disinformation, the U.S. military appears increasingly comfortable with the tools of its adversaries.

In a globalized age, the potential blowback of such tactics is easy to imagine. A Facebook message intended for Iranians is just two clicks — translate and share — away from going viral in the United States. Modern history contains plenty of examples of government propaganda campaigns spreading out of control. A Reuters report in June revealed that the U.S. military was behind a covert anti-China vaccine campaign targeting the Philippines. According to the news organization:

It aimed to sow doubt about the safety and efficacy of vaccines and other life-saving aid that was being supplied by China, a Reuters investigation found. Through phony internet accounts meant to impersonate Filipinos, the military’s propaganda efforts morphed into an anti-vax campaign. Social media posts decried the quality of face masks, test kits and the first vaccine that would become available in the Philippines – China’s Sinovac inoculation.

Reuters identified at least 300 accounts on X, formerly Twitter, that matched descriptions shared by former U.S. military officials familiar with the Philippines operation. Almost all were created in the summer of 2020 and centered on the slogan #Chinaangvirus – Tagalog for China is the virus. … Briefed on the Pentagon’s secret anti-vax campaign by Reuters, some American public health experts also condemned the program, saying it put civilians in jeopardy for potential geopolitical gain.

While the United States does have laws regulating foreign propaganda, they are showing their age. In 1972, amendments to the Smith-Mundt Act (the law enabling programs like Voice of America) banned the State Department from exposing Americans to propaganda meant for foreign audiences. Under Smith-Mundt, foreign influence operations could only happen in languages and locations inaccessible to Americans.

Restrictions aimed at the State Department, however, have no bearing on the Department of Defense, where most 21st century psy-op projects take place. The 2012 Smith-Mundt Modernization Act neutered restrictions on online messaging, and in 2019, Congress passed Section 1631, expanding the military’s power to engage in covert information operations.

Many experts have framed such online psy-ops as a vital theater of competition with China and Russia, each of which has its own centralized disinformation program. As one senior official put it, “ceding an entire domain to an adversary would be unwise.”

Yet such concern over a “disinformation gap” may be misplaced. While cyberspace remains important, it is far from clear that disinformation is even a useful tool. Foreign influence campaigns often struggle to bridge cultural divides. Many of the posts in the Stanford study, for instance, translated English literally and used American hashtags instead of the local language.

Worryingly, while such mistakes reduce impact on foreigners, they make false messages more accessible to a domestic audience. China’s vaunted “50-cent army” of online propagandists, for example, is by most accounts far better at controlling domestic discourse and attacking dissidents than at impersonating foreigners.

None of this is to say that disinformation is to be taken lightly. However, as a weapon it is harder to wield and quicker to produce blowback than many assume. The United States can do better than fight fire with fire.

As tempting as it may be to go back to the Smith-Mundt-style separation between foreign and domestic audiences, global media has changed too much to put the toothpaste back in the tube. The United States will have to chart a new path, perhaps by forswearing disinformation as a tool of foreign affairs and national security. Such a policy may sound impractical, but it is not dissimilar to the intelligence community’s “duty to warn” other countries, even rivals such as Russia and Iran, about terror threats.

As disinformation supplants terrorism on lists of global threats, it may be time to treat it with the same moral clarity.

In past decades, the United States sought to sing two songs at once, spreading propaganda abroad while preserving democracy at home. In the present globalized era, however, we can no longer have it both ways. We must either temper our foreign policy with our domestic values or risk our democracy being corroded by the very measures meant to protect it.
